Cluster-based adaptive metric classification
Authors
Abstract
Similar Articles
Cluster-based adaptive metric classification
Introducing an adaptive metric has been shown to improve the results of distance-based classification algorithms. Existing methods are often computationally intensive in either the training or the classification phase. We present a novel algorithm that we call Cluster-Based Adaptive Metric (CLAM) classification. It first determines the number of clusters in each class of a training set and the...
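The abstract only sketches the method, so the following is a minimal, hypothetical illustration of the general idea (cluster each class, fit a per-cluster metric, classify by nearest cluster), not the authors' algorithm. The diagonal inverse-variance metric, the `fit_clam`/`predict` names, and the use of plain k-means are all assumptions made for demonstration.

```python
# Hypothetical sketch of a cluster-based adaptive metric classifier.
# NOT the CLAM algorithm from the paper: the per-cluster metric here is a
# simple diagonal inverse-variance weighting, chosen only for illustration.
import random

def dist2(a, b):
    """Squared Euclidean distance between two points (tuples)."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def mean(pts):
    """Component-wise mean of a list of points."""
    n = len(pts)
    return tuple(sum(c) / n for c in zip(*pts))

def kmeans(points, k, iters=20, seed=0):
    """Plain k-means; returns k centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k), key=lambda i: dist2(p, centroids[i]))
            groups[j].append(p)
        centroids = [mean(g) if g else centroids[i]
                     for i, g in enumerate(groups)]
    return centroids

def fit_clam(data, k):
    """data: {label: [points]}. Cluster each class, then attach a
    diagonal inverse-variance weight vector to every cluster."""
    clusters = []  # entries: (label, centroid, per-dimension weights)
    for label, pts in data.items():
        cents = kmeans(pts, k)
        groups = [[] for _ in cents]
        for p in pts:  # assign each point to its nearest within-class centroid
            j = min(range(len(cents)), key=lambda i: dist2(p, cents[i]))
            groups[j].append(p)
        for c, g in zip(cents, groups):
            if not g:
                continue
            var = [max(sum((p[d] - c[d]) ** 2 for p in g) / len(g), 1e-6)
                   for d in range(len(c))]
            clusters.append((label, c, [1.0 / v for v in var]))
    return clusters

def predict(clusters, x):
    """Label of the cluster nearest to x under its own adaptive metric."""
    def adist(entry):
        _, c, w = entry
        return sum(wi * (xi - ci) ** 2 for wi, xi, ci in zip(w, x, c))
    return min(clusters, key=adist)[0]

# Tiny synthetic example: two well-separated classes in 2-D.
data = {
    "a": [(0.0, 0.0), (0.1, 0.2), (-0.1, 0.1), (0.2, -0.1)],
    "b": [(3.0, 3.0), (3.1, 2.9), (2.9, 3.2), (3.2, 3.1)],
}
model = fit_clam(data, k=1)
```

Determining the number of clusters per class is the part the abstract highlights; the sketch above simply takes `k` as given.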
Corrigendum to "Cluster-based adaptive metric classification" [Neurocomputing 81 (2012) 33-40]
The components analysis family of algorithms learns transformations from constraints as explained in the survey of Yang [34]: ‘‘Relevant Components Analysis (RCA) [10] learns a global linear transformation from (a set of) equivalence constraints. The learned transformation can be used directly to compute (a) distance between any two examples. Discriminative Component Analysis (DCA) and Kernel D...
Adaptive Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chisqu...
An Adaptive Metric Machine for Pattern Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with finite samples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose a locally adaptive nearest neighbor classification method to try to minimize bias. We use a Chi-sq...
Adaptive Kernel Metric Nearest Neighbor Classification
Nearest neighbor classification assumes locally constant class conditional probabilities. This assumption becomes invalid in high dimensions due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. We propose an adaptive nearest neighbor classification method to try to minimize bias. We use quasiconformal transformed kernels t...
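The three nearest-neighbor abstracts above share one idea: reweight feature dimensions so that distance reflects local class relevance. As a hedged illustration only, the sketch below replaces the papers' Chi-squared relevance measure with a simpler Fisher-style ratio (between-class spread over within-class variance); the names `feature_weights` and `weighted_nn` are assumptions, not the authors' code.

```python
# Illustrative weighted nearest-neighbor rule. The relevance weights use a
# Fisher-style between/within variance ratio as a stand-in for the
# Chi-squared measure described in the abstracts (an assumption).
def feature_weights(X, y):
    """Per-dimension relevance: between-class spread / within-class variance."""
    labels = sorted(set(y))
    dims = len(X[0])
    overall = [sum(x[d] for x in X) / len(X) for d in range(dims)]
    weights = []
    for d in range(dims):
        between = within = 0.0
        for lbl in labels:
            pts = [x for x, t in zip(X, y) if t == lbl]
            m = sum(p[d] for p in pts) / len(pts)
            between += len(pts) * (m - overall[d]) ** 2
            within += sum((p[d] - m) ** 2 for p in pts)
        weights.append(between / (within + 1e-9))
    return weights

def weighted_nn(X, y, w, q):
    """1-NN prediction for query q under the weighted squared distance."""
    def d2(x):
        return sum(wi * (xi - qi) ** 2 for wi, xi, qi in zip(w, x, q))
    return y[min(range(len(X)), key=lambda i: d2(X[i]))]

# Dimension 0 separates the classes; dimension 1 is noise, so its weight
# should come out much smaller.
X = [(0.0, 5.0), (0.2, -4.0), (0.1, 8.0), (3.0, 6.0), (3.2, -5.0), (2.9, 7.0)]
y = ["a", "a", "a", "b", "b", "b"]
w = feature_weights(X, y)
```

Down-weighting the noisy dimension is what keeps the neighborhood "locally constant" in the sense the abstracts describe.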
Journal
Journal title: Neurocomputing
Year: 2012
ISSN: 0925-2312
DOI: 10.1016/j.neucom.2011.10.018